FLAG: Fast Linearly-Coupled Adaptive Gradient Method

Authors

  • Xiang Cheng
  • Farbod Roosta-Khorasani
  • Peter L. Bartlett
  • Michael W. Mahoney
Abstract

The celebrated Nesterov’s accelerated gradient method offers great speed-ups compared to the classical gradient descent method as it attains the optimal first-order oracle complexity for smooth convex optimization. On the other hand, the popular AdaGrad algorithm competes with mirror descent under the best regularizer by adaptively scaling the gradient. Recently, it has been shown that accelerated gradient descent can be viewed as a linear combination of gradient descent and mirror descent steps. Here, we draw upon these ideas and present a fast linearly-coupled adaptive gradient method (FLAG) as an accelerated version of AdaGrad, and show that our algorithm can indeed offer the best of both worlds. Like Nesterov’s accelerated algorithm and its proximal variant, FISTA, our method has a convergence rate of 1/T^2 after T iterations. Like AdaGrad, our method adaptively chooses a regularizer, in a way that performs almost as well as the best choice of regularizer in hindsight.
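
The abstract only sketches the method at a high level; the actual step sizes, coupling weights, and adaptive regularizer are specified in the paper itself. Purely as an illustration of the idea it describes, the following is a minimal sketch of linearly coupling a gradient-descent step with an AdaGrad-style, diagonally scaled mirror-descent step. The function flag_style_sketch, the smoothness constant L, the coupling weight tau, and the step-size schedule are placeholder assumptions, not the constants of FLAG.

    import numpy as np

    def flag_style_sketch(grad, x0, T=100, L=10.0, delta=1e-8):
        # Illustrative only: couples a gradient-descent step with an
        # AdaGrad-style (diagonally scaled) mirror-descent step.
        y = np.array(x0, dtype=float)          # "gradient" sequence
        z = y.copy()                           # "mirror" sequence
        G = np.zeros_like(y)                   # running sum of squared gradients
        for t in range(1, T + 1):
            tau = 2.0 / (t + 1)                # coupling weight
            x = tau * z + (1.0 - tau) * y      # linearly couple the two sequences
            g = grad(x)
            y = x - g / L                      # gradient-descent step
            G += g * g
            H = np.sqrt(G) + delta             # AdaGrad-style diagonal regularizer
            alpha = t / (2.0 * L)              # growing mirror step size (placeholder)
            z = z - alpha * g / H              # adaptively scaled mirror step
        return y

    # Toy usage: minimize the quadratic f(x) = 0.5 * ||A x - b||^2.
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([1.0, -1.0])
    x_hat = flag_style_sketch(lambda x: A.T @ (A @ x - b), x0=np.zeros(2), L=15.0)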

Similar articles

Linearly Constrained Adaptive Filtering Algorithms Designed Using Control Liapunov Functions

The standard conjugate gradient (CG) method uses orthogonality of the residues to simplify the formulas for the parameters necessary for convergence. In adaptive filtering, the sample-by-sample update of the correlation matrix and the cross-correlation vector causes a loss of the residue orthogonality in a modified online algorithm, which, in turn, results in loss of convergence and an increase...
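
For context, here is a minimal textbook conjugate gradient solver for a symmetric positive-definite system Ax = b, not the adaptive-filtering variant the paper develops. The beta update below is the simplified formula that relies on orthogonality of the residues, the property the snippet notes is lost when the correlation matrix is updated sample by sample.

    import numpy as np

    def conjugate_gradient(A, b, x0=None, tol=1e-10, max_iter=None):
        # Textbook CG for symmetric positive-definite A.
        n = len(b)
        x = np.zeros(n) if x0 is None else np.array(x0, dtype=float)
        r = b - A @ x                     # residue
        p = r.copy()                      # search direction
        rs_old = r @ r
        for _ in range(max_iter or n):
            Ap = A @ p
            alpha = rs_old / (p @ Ap)     # step length
            x += alpha * p
            r -= alpha * Ap
            rs_new = r @ r
            if np.sqrt(rs_new) < tol:
                break
            beta = rs_new / rs_old        # simplification that uses residue orthogonality
            p = r + beta * p
            rs_old = rs_new
        return x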

A Direction Set Based Algorithm for Adaptive Least Squares Problems in Signal Processing

A fast algorithm, called the direction set based algorithm, has been developed recently for solving a class of adaptive least squares problems arising in signal processing. The algorithm is based on the direction set method developed by Powell and Zangwill for solving unconstrained minimization problems without using derivatives. It is designed so as to fully take advantage of the special struc...
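
The snippet is cut off before the algorithm itself, so the sketch below shows only the classical Powell/Zangwill direction-set iteration it builds on: derivative-free line minimizations along a maintained set of directions, with the oldest direction replaced by the net displacement of each sweep. The line search via scipy.optimize.minimize_scalar and the stopping tolerance are placeholder choices of mine, not the paper's adaptive least-squares specialization.

    import numpy as np
    from scipy.optimize import minimize_scalar

    def direction_set_minimize(f, x0, n_sweeps=20):
        # Basic Powell/Zangwill direction-set method, derivative-free.
        x = np.array(x0, dtype=float)
        dirs = list(np.eye(len(x)))                # initial coordinate directions
        for _ in range(n_sweeps):
            x_start = x.copy()
            for d in dirs:
                t = minimize_scalar(lambda s: f(x + s * d)).x
                x = x + t * d                      # line minimization along d
            net = x - x_start                      # net displacement of the sweep
            norm = np.linalg.norm(net)
            if norm < 1e-12:
                break
            dirs = dirs[1:] + [net / norm]         # replace the oldest direction
            t = minimize_scalar(lambda s: f(x + s * dirs[-1])).x
            x = x + t * dirs[-1]
        return x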

A Random Coordinate Descent Method on Large-scale Optimization Problems with Linear Constraints

In this paper we develop a random block coordinate descent method for minimizing large-scale convex problems with linearly coupled constraints and prove that it obtains in expectation an ε-accurate solution in at most O(1/ε) iterations. However, the numerical complexity per iteration of the new method is usually much cheaper than that of methods based on full gradient information. We focus on...
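
The snippet stops before the update rule, so the following is only a toy illustration of the general idea of coordinate descent under a linearly coupling constraint: each iteration picks a random pair of coordinates and moves along e_i - e_j, which leaves the constraint sum(x) = const intact while using only two entries of the gradient. The step size and the projection example are assumptions of mine, not the paper's method or rate analysis.

    import numpy as np

    def random_pair_coordinate_descent(grad, x0, n_iters=5000, step=0.1, rng=None):
        # Toy 2-coordinate update that preserves the coupling constraint sum(x) = const.
        rng = rng or np.random.default_rng(0)
        x = np.array(x0, dtype=float)
        n = len(x)
        for _ in range(n_iters):
            i, j = rng.choice(n, size=2, replace=False)
            g = grad(x)                   # full gradient for simplicity;
            d = g[i] - g[j]               # only entries i and j are actually used
            x[i] -= step * d              # move along e_i - e_j ...
            x[j] += step * d              # ... so sum(x) stays unchanged
        return x

    # Toy usage: project the point c onto the affine set {x : sum(x) = 1},
    # i.e. minimize 0.5 * ||x - c||^2 subject to the coupling constraint.
    c = np.array([0.7, 0.1, 0.4])
    x0 = np.full(3, 1.0 / 3)              # feasible starting point, sum = 1
    x = random_pair_coordinate_descent(lambda x: x - c, x0)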

Image Enhancement Using an Adaptive Un-sharp Masking Method Considering the Gradient Variation

Technical limitations in image capturing usually impose defects, such as contrast degradation. There are different approaches to improve the contrast of an image. Among the existing approaches, un-sharp masking is a popular method due to its simplicity in implementation and computation. There is an important parameter in un-sharp masking, called the gain factor, which affects the quality of the enh...
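
For reference, the sketch below is plain un-sharp masking with a fixed global gain factor; the paper's contribution, adapting the gain from the local gradient variation, is not reproduced here. The Gaussian blur width sigma and the 0-255 clipping range are assumptions about the input image, not values from the paper.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def unsharp_mask(image, gain=1.0, sigma=2.0):
        # Classical un-sharp masking with a single global gain factor.
        img = image.astype(float)
        blurred = gaussian_filter(img, sigma=sigma)   # low-pass copy
        detail = img - blurred                        # high-frequency "mask"
        enhanced = img + gain * detail                # add back scaled detail
        return np.clip(enhanced, 0, 255)              # assume an 8-bit intensity range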

An Optimal Control Modification to Model-Reference Adaptive Control for Fast Adaptation

This paper presents a method that can achieve fast adaptation for a class of model-reference adaptive control. It is well-known that standard model-reference adaptive control exhibits high-gain control behaviors when a large adaptive gain is used to achieve fast adaptation in order to reduce tracking error rapidly. High-gain control creates high-frequency oscillations that can excite unmodeled d...
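
The snippet contrasts fast adaptation with the high-gain behaviour it induces. The sketch below is a standard Lyapunov-based model-reference adaptive controller for a first-order scalar plant, not the paper's optimal-control modification; it only lets one raise the adaptive gain gamma and observe the faster but more oscillatory adaptation being discussed. All plant, model, and gain values are made-up illustration numbers.

    import numpy as np

    def mrac_scalar_demo(gamma=10.0, T=20.0, dt=1e-3):
        # Standard MRAC for x' = a*x + b*u (a unknown, sign(b) known positive),
        # tracking the reference model xm' = am*xm + bm*r.
        a, b = 1.0, 1.0                  # true plant, unknown to the controller
        am, bm = -2.0, 2.0               # stable reference model
        x = xm = 0.0
        kx = kr = 0.0                    # adaptive feedback / feedforward gains
        log = []
        for k in range(int(T / dt)):
            t = k * dt
            r = 1.0 if (t % 10.0) < 5.0 else -1.0    # square-wave reference
            e = x - xm                               # tracking error
            u = kx * x + kr * r
            # Euler integration of plant, reference model, and adaptive laws
            x += dt * (a * x + b * u)
            xm += dt * (am * xm + bm * r)
            kx += dt * (-gamma * x * e)              # Lyapunov-based update
            kr += dt * (-gamma * r * e)
            log.append((t, x, xm, e))
        return np.array(log)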

Journal:
  • CoRR

Volume: abs/1605.08108

Publication date: 2016